
Existential risk refers to any event or series of events that could irreversibly destroy humanity's potential, either by causing human extinction or by dramatically and permanently curtailing our future development. The term encompasses both natural and anthropogenic threats, including, but not limited to, runaway artificial intelligence, nuclear war, catastrophic climate change, biotechnological misadventures, and self-replicating nanotechnology. These risks demand urgent and comprehensive attention because their realization would mean not just the loss of millions of lives, but the annihilation of all possible future lives and achievements. Addressing existential risks effectively requires unprecedented levels of global collaboration, systemic thinking, and preventive measures, pooling multidisciplinary insights to ensure the long-term flourishing of humanity and the biosphere.

See also: catastrophic risk, existential threat, nuclear weapon, permaculture, mutually assured destruction

Daniel Schmachtenberger: Steering Civilization Away from Self-Destruction | Lex Fridman Podcast #191 (452,171 views)

Daniel Schmachtenberger on The Portal (with host Eric Weinstein), Ep. #027 - On Avoiding Apocalypses (350,722 views)

DarkHorse Podcast with Daniel Schmachtenberger & Bret Weinstein (285,486 views)

War on Sensemaking V, Daniel Schmachtenberger (135,606 views)

Saving Civilization: Healthcare, Tech, Democracy (w/Daniel Schmachtenberger) (86,669 views)

rp daily: a conversation that will blow your mind with Daniel Schmachtenberger (68,996 views)

Human civilization is a self-terminating system | Daniel Schmachtenberger and Lex Fridman (64,696 views)

In Search of the Third Attractor, Daniel Schmachtenberger (part 1) (59,879 views)

Daniel Schmachtenberger "Bend Not Break Part 1: Energy Blindness" | The Great Simplification #05 (50,646 views)